Search results

1 – 10 of 15
Article
Publication date: 8 December 2017

Donald V. Widener, Thomas A. Mazzuchi and Shahram Sarkani

The purpose of this paper is to propose an effective knowledge elicitation method and representation scheme that empowers humanitarian assistance/disaster relief (HA/DR) analysts…

Abstract

Purpose

The purpose of this paper is to propose an effective knowledge elicitation method and representation scheme that empowers humanitarian assistance/disaster relief (HA/DR) analysts and experts to create analytic models without the aid of data scientists and methodologists while addressing the issues of complexity, collaboration, and emerging technology across a diverse global network of HA/DR organizations.

Design/methodology/approach

The paper used a mixed-methods research approach, with qualitative research and analysis to select the model elicitation method, followed by quantitative data collection and evaluation to test the representation scheme. A simplified analytic modeling approach was created based on emerging activity-based intelligence (ABI) analytic methods.

Findings

Using open source data on the Syrian humanitarian crisis as the reference mission, the study showed ABI analytic models to be capable of modeling HA/DR scenarios involving physical systems, nonphysical systems, and thinking.

Practical implications

As a data-agnostic approach to develop object and network knowledge, ABI aligns with the objectives of modeling within multiple HA/DR organizations.

Originality/value

Using an analytic method as the basis for model creation allows for immediate adoption by analysts and removes the need for data scientists and methodologists in the elicitation phase. Applying this highly effective cross-domain ABI data fusion technique should also offset the accuracy weaknesses of traditional simplified analytic models.

Details

Disaster Prevention and Management, vol. 27 no. 1
Type: Research Article
ISSN: 0965-3562

Article
Publication date: 30 August 2021

Omar Taha, Thomas A. Mazzuchi, Shahram Sarkani, Jiju Antony and Sandra Furterer

The purpose of this paper is to apply Lean in the workers’ compensation industry. It focuses on identifying patterns of repetitive non-value-added transactional activities for…

Abstract

Purpose

The purpose of this paper is to apply Lean in the workers’ compensation industry. It focuses on identifying patterns of repetitive non-value-added transactional activities for physical-therapy patients and healthcare providers, addressing a research gap in this field.

Design/methodology/approach

In this study, we designed and deployed multiple case studies to better understand the journey of an injured worker within the workers’ compensation system in the United States of America. We partnered with Concentra Inc., a leading national healthcare provider in the field of workers’ compensation with 520 medical centers in 44 states. Both case studies included direct observations (Gemba walks) in five clinics across two states, Florida and Pennsylvania. We analyzed data on 263 injured workers with eight or more physical therapy visits who were admitted to Concentra clinics in both states over a period of 31 days.

Findings

The results revealed that activities associated with physical therapy treatment pre-authorization accounted for 91.59% of total non-value-added time and were thus the key administrative factor driving process inefficiency in the state of Florida. The Process Cycle Efficiency in Pennsylvania was 75.36%, compared with 53.16% in Florida. Injured workers in Florida needed 39.58 days on average to complete eight physical therapy visits, compared with 27.92 days in Pennsylvania (medians of 34.09 vs 22.15 days).
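
Process Cycle Efficiency, the Lean metric cited above, is conventionally computed as value-added time divided by total lead time. A minimal Python sketch of the calculation (the figures in the comment are hypothetical illustrations, not the study’s underlying time data):

```python
def process_cycle_efficiency(value_added_time: float, total_lead_time: float) -> float:
    """Lean PCE: the fraction of total lead time spent on value-added work."""
    return value_added_time / total_lead_time

# Hypothetical illustration only: if 21 of 28 calendar days were value-added,
# PCE would be 0.75, comparable in magnitude to the 75.36% reported for Pennsylvania.
print(process_cycle_efficiency(21, 28))  # 0.75
```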

Research limitations/implications

This study is limited in that it focuses only on processes on the healthcare provider side. An expanded value stream map that includes the treatment pre-authorization process on the insurance side would be beneficial for generating more potential solutions to streamline the process.

Practical implications

This study shows that Lean could play a critical role in identifying and quantifying continuous improvement opportunities that could accelerate patients’ treatment, reduce the administrative burden on healthcare providers and improve the overall claim cost for insurance companies. It provides a data-driven argument for insurance companies to consider eliminating physical therapy pre-authorization.

Originality/value

This is the first study to apply Lean methodology in the workers’ compensation field.

Details

The TQM Journal, vol. 34 no. 5
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 31 July 2009

Donald E. Hutto, Thomas Mazzuchi and Shahram Sarkani

The purpose of this paper is to provide maintenance personnel with a methodology for using masked field reliability data to determine the probability of each subassembly failure.

Abstract

Purpose

The purpose of this paper is to provide maintenance personnel with a methodology for using masked field reliability data to determine the probability of each subassembly failure.

Design/methodology/approach

The paper compares an iterative maximum likelihood estimation method and a Bayesian methodology for handling masked data collected from 227 identical radar power supplies. The power supply consists of several subassemblies hereafter referred to as shop replaceable assemblies (SRAs).

Findings

The study examined two approaches for dealing with masking: an iterative maximum likelihood estimation procedure (IMLEP) and a Bayesian approach implemented in the WinBUGS application. It indicates that IMLEP and WinBUGS perform similarly in estimating the parameters of the SRA distribution under no-masking conditions, and that they also provide similar results under masking conditions. However, the study indicates that WinBUGS may perform better than IMLEP when the competing risk responsible for a failure represents a smaller percentage of total failures. Future study is required to confirm this conclusion by expanding the number of SRAs into which the item under study is organized.

Research limitations/implications

If an item comprises several subassemblies and fails as soon as the first subassembly fails, the item is referred to in the literature as a series system. If each subassembly’s failure is statistically independent of the others, the item can be represented by a competing risk model, and the probability distributions of the subassemblies can be ascertained from the item’s failure data. When the item’s cause of failure is not known, the data are referred to in the literature as masked. Since competing risk theory requires both a cause of failure and a time of failure, masked data must be addressed explicitly in the competing risk model.
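
As a concrete illustration of the mechanics described above, the following is a minimal Python sketch of maximum likelihood estimation for masked competing-risk data, assuming exponential subassembly lifetimes and the usual symmetry assumption on masking. It is a generic illustration with hypothetical observations, not the paper’s IMLEP procedure or its WinBUGS model.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical observations: (failure time, candidate SRA indices).
# A singleton set means the failed SRA was identified; a larger set means the cause was masked.
observations = [
    (120.0, {0}),
    (310.5, {0, 1}),
    (95.2,  {2}),
    (410.0, {0, 1, 2}),
]
n_sras = 3

def neg_log_likelihood(log_rates: np.ndarray) -> float:
    rates = np.exp(log_rates)  # parameterize on the log scale to keep rates positive
    total = rates.sum()
    ll = 0.0
    for t, candidates in observations:
        # Exponential competing risks with masking (symmetry assumption):
        # contribution = (sum of candidate hazards) * exp(-t * total hazard)
        ll += np.log(rates[list(candidates)].sum()) - total * t
    return -ll

result = minimize(neg_log_likelihood, x0=np.zeros(n_sras), method="Nelder-Mead")
print("Estimated SRA failure rates:", np.exp(result.x))
```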

Practical implications

This study indicates that competing risk theory can be applied to equipment field-failure data to determine an SRA’s probability of failure and thereby provide an efficient sequence for replacing suspect failed SRAs.

Originality/value

The analysis of masked failure data is an important area that has received only limited study in the literature because of the limited availability of failure data. This paper contributes to the research by providing the complete historical equipment usage data for the item under study, gathered over approximately seven years.

Details

International Journal of Quality & Reliability Management, vol. 26 no. 7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 19 April 2013

Paul Blessner, Thomas A. Mazzuchi and Shahram Sarkani

The purpose of this paper is to provide insights into the relationship between ISO 9001 conformance of suppliers and the quality of products they provide, within a procurement…

Abstract

Purpose

The purpose of this paper is to provide insights into the relationship between ISO 9001 conformance of suppliers and the quality of products they provide, within a procurement system of a manufacturer operating under contracts with the US Department of Defense.

Design/methodology/approach

Chi‐square tests of independence were performed to compare the receipt acceptance rate of material provided by ISO 9001‐conforming suppliers to that of non‐ISO 9001‐conforming suppliers, for more than 46,000 receipts representing 21 material commodity groups provided by almost 800 suppliers. Acceptance of receipts required conformance to both hardware and paperwork requirements. Tests were also performed on data subsets, to determine the impact of ISO 9001 conformance on product quality for each of the 21 material commodity groups, for manufacturers versus distributors, and for two material control levels.
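
As an illustration of this type of analysis, the following Python sketch runs a chi-square test of independence on a 2x2 supplier-acceptance table with scipy; the counts are hypothetical placeholders, not the paper’s receipt data.

```python
from scipy.stats import chi2_contingency

# Rows: ISO 9001-conforming vs non-conforming suppliers; columns: accepted vs rejected receipts.
# These counts are made up for illustration.
contingency = [
    [18500, 900],   # ISO 9001-conforming
    [26000, 600],   # non-conforming
]

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4g}, dof = {dof}")
```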

Findings

For the overall data set, and for the majority of data subsets analyzed, the product quality of non‐ISO 9001‐conforming suppliers was significantly better than that of ISO 9001‐conforming suppliers. When only hardware non‐conformances were considered to cause rejections, the results were similar, but effect sizes were generally smaller.

Research limitations/implications

The quantities of receipts and suppliers included in this investigation were very large; however, care should be exercised in generalizing the results, because of the potential influence of the defense industry‐related requirements imposed upon the material and the suppliers.

Originality/value

This is believed to be the first paper to investigate the impact of ISO 9001 conformance on product quality using a large quantity of actual product data, for both ISO 9001‐conforming and non‐ISO 9001‐conforming suppliers, in contrast to numerous assessments of quality impact performed using interview and survey data.

Details

The TQM Journal, vol. 25 no. 3
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 1 July 2006

Hsin Kao, Peng‐Hsian Kao and Thomas A. Mazzuchi

Many scholars and practitioners debate whether Taiwanese government investment policy should follow a "Go‐South" or a "Go‐West" approach. Therefore, this paper aims to present data from two…

Abstract

Purpose

Many scholars and practitioners debate whether Taiwanese government investment policy should follow a "Go‐South" or a "Go‐West" approach. Therefore, this paper aims to present data from two groups showing how Taiwanese executives in China and in Malaysia differ in their knowledge management (KM) usage.

Design/methodology/approach

Knowledge management is very important, since enterprises are eager to create value through the better use of knowledge amid today’s globalization trend. In this study, the State of Knowledge Management: An Assessment Questionnaire, which includes 19 KM tools, was used to identify the usage frequencies of the various KM tools. Data were collected from 200 firms in China and Malaysia. These firms represent several manufacturing industries, including food, textiles, rubber and plastics, electronics, and metal manufacturing.

Findings

The results show that executives in China have higher scores on 16 of the 19 KM tools than executives in Malaysia, which means the former practice KM more than the latter. Chinese people in both Taiwan and China share the same traditional values of respect for age, authority, and hierarchy, as well as the same culture and language. Therefore, differences in cultural and contextual variables may not affect the mix of knowledge‐sharing problems.

Originality/value

This paper reveals useful information showing how Taiwanese executives in China and in Malaysia differ in their KM usage.

Details

VINE, vol. 36 no. 3
Type: Research Article
ISSN: 0305-5728

Article
Publication date: 19 July 2011

Jacqueline H. Hall, Shahram Sarkani and Thomas A. Mazzuchi

This research aims to examine the relationship between information security strategy and organization performance, with organizational capabilities as important factors…

Abstract

Purpose

This research aims to examine the relationship between information security strategy and organization performance, with organizational capabilities as important factors influencing successful implementation of information security strategy and organization performance.

Design/methodology/approach

Based on existing literature in strategic management and information security, a theoretical model was proposed and validated. A self‐administered survey instrument was developed to collect empirical data. Structural equation modeling was used to test hypotheses and to fit the theoretical model.
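
As a rough illustration of this workflow, the sketch below fits a structural equation model in Python with the semopy package; the construct names, indicator variables and data file are hypothetical stand-ins, not the paper’s instrument or validated model.

```python
import pandas as pd
from semopy import Model

# Hypothetical model: latent capabilities predict strategy implementation,
# which in turn predicts organization performance ("=~" defines measurement,
# "~" defines structural regressions).
model_desc = """
Capabilities =~ sense_making + decision_making + asset_availability + ops_management
Implementation =~ impl1 + impl2 + impl3
Performance =~ perf1 + perf2 + perf3
Implementation ~ Capabilities
Performance ~ Implementation
"""

survey_data = pd.read_csv("security_survey.csv")  # hypothetical survey responses
model = Model(model_desc)
model.fit(survey_data)
print(model.inspect())  # path estimates, standard errors and p-values
```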

Findings

Evidence suggests that organizational capabilities, encompassing the ability to develop high‐quality situational awareness of the current and future threat environment, the ability to possess appropriate means, and the ability to orchestrate the means to respond to information security threats, are positively associated with effective implementation of information security strategy, which in turn positively affects organization performance. However, there is no significant relationship between decision making and information security strategy implementation success.

Research limitations/implications

The study provides a starting point for further research on the role of decision‐making in information security.

Practical implications

Findings are expected to yield practical value for business leaders by clarifying which organizational capabilities matter in the context of information security, thus enabling firms to focus on acquiring the capabilities indispensable for improving organization performance.

Originality/value

This study contributes to the body of knowledge with an empirical analysis of an organization’s information security capabilities as an aggregation of sense making, decision‐making, asset availability, and operations management constructs.

Article
Publication date: 14 August 2017

Philip William Sisson and Julie J.C.H. Ryan

This paper aims to clarify the need for Chief Knowledge Officers (CKOs) and explain how some recent views on competencies for educational guidelines, a Knowledge Management (KM…

Abstract

Purpose

This paper aims to clarify the need for Chief Knowledge Officers (CKOs) and explain how some recent views on competencies for educational guidelines, a Knowledge Management (KM) competency model and expansion of practice management concepts make the need for CKOs clearer.

Design/methodology/approach

This viewpoint was developed in response to recent publications disparaging the idea of a CKO. The method used was to extract ideas from published papers and works in progress to establish the basis for, and explain, the postulated Unified Competency Theory of KM and its implications regarding the need for CKOs.

Findings

CKOs are needed to ensure that all organizationally relevant functions’ knowledge and KM assessments and/or audits are individually complete and collectively sufficient. A risk/opportunity management role also provides justification.

Research limitations/implications

This paper mainly limits its discussion to the papers that comprise research leading to the Unified Competency Theory of KM, its implications and an updated practice management model. Other points of view that might substantiate or refute the conclusions have not been addressed.

Practical implications

The KM field needs to better identify KM’s risk and opportunity management role and functional imperative. Organizations may need to reevaluate their directions with regard to KM and a CKO.

Originality/value

It extends the concept of practice management to permit differentiation among disciplines. It provides a new rationale for CKOs.

Details

VINE Journal of Information and Knowledge Management Systems, vol. 47 no. 3
Type: Research Article
ISSN: 2059-5891

Content available
Book part
Publication date: 6 April 2018

Michael Stankosky and Carolyn R. Baldanza

Details

21 for 21
Type: Book
ISBN: 978-1-78743-787-6

Content available
Article
Publication date: 1 July 2006

Michael Stankosky

Details

VINE, vol. 36 no. 3
Type: Research Article
ISSN: 0305-5728

Article
Publication date: 30 October 2019

Vibha Verma, Sameer Anand and Anu Gupta Aggarwal

The purpose of this paper is to identify and quantify the key components of the overall cost of software development when warranty coverage is given by a developer. Also, the…

Abstract

Purpose

The purpose of this paper is to identify and quantify the key components of the overall cost of software development when warranty coverage is given by a developer. The authors also study the impact of imperfect debugging on the optimal release time, warranty policy and development cost, which shows that it is important for developers to control the parameters that cause a sharp increase in cost.

Design/methodology/approach

An optimization problem is formulated to minimize software development cost by considering an imperfect fault removal process, fault generation at a constant rate and an environmental factor that differentiates the operational phase from the testing phase. Another optimization problem under perfect debugging conditions, i.e. without error generation, is constructed for comparison. These optimization models are solved in MATLAB, and their solutions provide insight into the degree of impact of imperfect debugging on the optimal policies for software release time and warranty duration.
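
As a rough illustration of this kind of formulation, the sketch below minimizes a release-plus-warranty cost function in Python with scipy rather than MATLAB; the mean value function, cost coefficients and parameter values are generic placeholders drawn from the software reliability growth literature, not the paper’s actual model.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder parameters: initial fault content, detection rate, error-generation rate.
a, b, alpha = 100.0, 0.05, 0.1
# Placeholder unit costs: fixing a fault in testing, under warranty, in the field,
# plus per-unit-time testing cost and per-unit-time warranty servicing cost.
c_test, c_warranty, c_field, c_time, c_support = 1.0, 3.0, 10.0, 0.5, 0.8
life_cycle = 300.0  # assumed end of the software's operational life (time units)

def mean_faults(t: float) -> float:
    """Expected faults removed by time t under imperfect debugging (generic form)."""
    return (a / (1.0 - alpha)) * (1.0 - np.exp(-b * (1.0 - alpha) * t))

def total_cost(x: np.ndarray) -> float:
    release, warranty = x
    end_warranty = release + warranty
    return (c_test * mean_faults(release)                                      # testing-phase fixes
            + c_warranty * (mean_faults(end_warranty) - mean_faults(release))  # fixes under warranty
            + c_field * (mean_faults(life_cycle) - mean_faults(end_warranty))  # post-warranty field fixes
            + c_time * release + c_support * warranty)                         # duration-driven costs

result = minimize(total_cost, x0=np.array([50.0, 20.0]),
                  bounds=[(1.0, 150.0), (0.0, 150.0)], method="L-BFGS-B")
print("Optimal release time and warranty length:", result.x)
```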

Findings

A real-life fault data set from a radar system is used to study, via sensitivity analysis, the impact of various cost factors on the release and warranty policy. If firms provide warranty coverage for a longer period, they may bear losses from the increased debugging cost of the larger number of failures occurring during the warranty period; but if the warranty is not provided for sufficient time, it may not act as a sufficient hedge against field failures.

Originality/value

Every firm is fighting to remain competitive and expand its market share by offering the latest technology-based products and using innovative marketing strategies. Warranty is one such strategic tool, used to promote the product among the masses and develop a sense of quality in the user’s mind. In this paper, the failures encountered during development and after software release are considered to model the failure process.

Details

International Journal of Quality & Reliability Management, vol. 37 no. 9/10
Type: Research Article
ISSN: 0265-671X
